Sparse nonlinear discriminants

Author

  • Edin Andelic
Abstract

This thesis considers training algorithms for machine learning and their applications to classification, regression and automatic speech recognition. In particular, supervised learning, also called learning from samples, is considered. Starting with a short introduction to statistical learning theory, it is shown that supervised learning can be formulated as a function estimation problem in which the class of linear functions turns out to be an appropriate choice for obtaining solutions with high generalization ability. The performance of linear functions can be enhanced further by so-called kernel functions, which effectively turn certain linear algorithms into nonlinear ones by implicitly mapping the data into a high-dimensional feature space. An important advantage of kernel functions is that the estimated functions remain linear in this feature space, so that the theoretical benefits of linear functions regarding generalization ability are preserved. The discriminant approach turns out to be well suited to such a kernel-induced transformation. We propose order-recursive algorithms that allow a nonlinear, kernel-induced version of the discriminant to be estimated at reasonable cost. These algorithms are based on two facts. First, the discriminant approach is equivalent to a certain least-squares regression. Second, the solution can be sparsified in the sense that only a small fraction of the training points is needed to represent it.
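The equivalence between the discriminant and a least-squares regression can be made concrete with a small sketch. The Python snippet below is illustrative only: the function names, the RBF kernel and the ridge term are assumptions, not the thesis's order-recursive algorithm. It regresses ±1 class labels with a kernelized least-squares fit, which is known to be equivalent, up to scaling, to the kernel Fisher discriminant; a sparse variant would restrict the kernel expansion to a small, e.g. greedily selected, subset of the training points.

    import numpy as np

    def rbf_kernel(X, Y, gamma=1.0):
        # Gaussian RBF kernel matrix between the rows of X and Y.
        d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)

    def fit_kernel_lsq_discriminant(X, y, gamma=1.0, ridge=1e-3):
        # Solve (K + ridge*I) alpha = y; regressing +/-1 labels by
        # least squares yields a kernel Fisher discriminant up to scaling.
        K = rbf_kernel(X, X, gamma)
        return np.linalg.solve(K + ridge * np.eye(len(y)), y.astype(float))

    def predict(X_train, alpha, X_test, gamma=1.0):
        # The sign of the kernel expansion gives the predicted class.
        return np.sign(rbf_kernel(X_test, X_train, gamma) @ alpha)

    # Toy usage: two Gaussian blobs, labels -1 and +1.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-1, 0.5, (20, 2)), rng.normal(+1, 0.5, (20, 2))])
    y = np.array([-1] * 20 + [+1] * 20)
    alpha = fit_kernel_lsq_discriminant(X, y)
    print((predict(X, alpha, X) == y).mean())  # training accuracy

Note that alpha has one coefficient per training point; sparsification, in the sense of the abstract, amounts to forcing most of these coefficients to zero.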


Similar resources

Some Results on Inhomogeneous Discriminants

We study generalized Horn-Kapranov rational parametrizations of inhomogeneous sparse discriminants from both a theoretical and an algorithmic perspective. We show that all these discriminantal parametrizations are birational, we prove some results on the corresponding implicit equations, and we propose a combinatorial algorithm to compute their degree in the uniform case of (co)dimension 3.


Kernel second-order discriminants versus support vector machines

Support vector machines (SVMs) are the best-known nonlinear classifiers based on the Mercer kernel trick. They generally lead to very sparse solutions that ensure good generalization performance. Recently, Mika et al. proposed a new nonlinear technique based on the kernel trick and the Fisher criterion: the nonlinear kernel Fisher discriminant (KFD). Experiments show that KFD is compe...


Boosted Dyadic Kernel Discriminants

We introduce a novel learning algorithm for binary classification with hyperplane discriminants based on pairs of training points from opposite classes (dyadic hypercuts). This algorithm is further extended to nonlinear discriminants using kernel functions satisfying Mercer’s conditions. An ensemble of simple dyadic hypercuts is learned incrementally by means of a confidence-rated version of Ad...
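A minimal sketch of the dyadic idea, under stated assumptions: each weak learner below is a hyperplane through the kernel images of one positive and one negative training point, and plain AdaBoost (not the paper's confidence-rated variant) combines them. The function names, the zero offset and the exhaustive pair search are illustrative choices.

    import numpy as np

    def dyadic_hypercut(k_pos, k_neg, b=0.0):
        # h(x) = sign(k(x, x+) - k(x, x-) + b): a hyperplane that
        # bisects the two anchor points in the kernel feature space.
        return np.sign(k_pos - k_neg + b)

    def boost_hypercuts(K, y, pos_idx, neg_idx, rounds=10):
        # Plain AdaBoost over dyadic hypercuts. K is a precomputed
        # kernel matrix, y holds +/-1 labels; each weak learner is
        # indexed by one (positive, negative) anchor pair.
        w = np.full(len(y), 1.0 / len(y))
        ensemble = []
        for _ in range(rounds):
            best = None
            for i in pos_idx:
                for j in neg_idx:
                    h = dyadic_hypercut(K[:, i], K[:, j])
                    err = w[h != y].sum()
                    if best is None or err < best[0]:
                        best = (err, i, j, h)
            err, i, j, h = best
            err = np.clip(err, 1e-10, 1.0 - 1e-10)
            alpha = 0.5 * np.log((1.0 - err) / err)  # weak-learner weight
            w *= np.exp(-alpha * y * h)              # reweight the samples
            w /= w.sum()
            ensemble.append((alpha, i, j))
        return ensemble

The final discriminant is then the sign of the alpha-weighted sum of the selected hypercuts.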


Classification and Reductio-ad-Absurdum

Proofs for the optimality of classification in real-world machine learning situations are constructed. The validity of each proof requires reasoning about the probability of certain subsets of feature vectors. It is shown that linear discriminants classify by making the least demanding assumptions on the values of these probabilities. This enables measuring the confidence of classification by l...


An efficient tree decomposition method for permanents and mixed discriminants

We present an efficient algorithm to compute permanents, mixed discriminants and hyperdeterminants of structured matrices and multidimensional arrays (tensors). We describe the sparsity structure of an array in terms of a graph, and we assume that its treewidth, denoted ω, is small. Our algorithm requires Õ(n·2^ω) arithmetic operations to compute permanents, and Õ(n² + n·3^ω) for mixed discrimina...
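For orientation, the quantity in question can also be computed directly: the brute-force expansion below (a generic baseline, not the paper's tree-decomposition method) evaluates the permanent in O(n·n!) time, which is the cost the treewidth-based Õ(n·2^ω) bound improves on for sparse matrices.

    from itertools import permutations
    from math import prod

    def permanent(A):
        # The determinant expansion without alternating signs;
        # O(n * n!) time, so only viable for small n.
        n = len(A)
        return sum(prod(A[i][p[i]] for i in range(n))
                   for p in permutations(range(n)))

    print(permanent([[1, 2], [3, 4]]))  # 1*4 + 2*3 = 10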




Publication year: 2007